Suppressing the Unusual: towards Robust CNNs using Symmetric Activation Functions
Authors
Abstract
Many deep Convolutional Neural Networks (CNNs) make incorrect predictions on adversarial samples obtained by imperceptible perturbations of clean samples. We hypothesize that this is caused by a failure to suppress unusual signals within network layers. As a remedy, we propose the use of Symmetric Activation Functions (SAF) in non-linear signal transducer units. These units suppress signals of exceptional magnitude. We prove that SAF networks can perform classification tasks to arbitrary precision in a simplified situation. In practice, rather than using SAFs alone, we add them into CNNs to improve their robustness. The modified CNNs can be easily trained using popular strategies with a moderate training load. Our experiments on MNIST and CIFAR-10 show that the modified CNNs perform similarly to plain ones on clean samples, and are remarkably more robust against adversarial and nonsense samples.
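To make the idea concrete, below is a minimal sketch of an even-symmetric activation unit in PyTorch. The Gaussian-bump shape and the sigma parameter are illustrative assumptions, not the paper's exact formulation; the point is only that the response peaks near zero and decays for inputs of exceptional magnitude, unlike a ReLU, which passes arbitrarily large positive signals through unchanged.

import torch
import torch.nn as nn

class SymmetricActivation(nn.Module):
    """Even-symmetric unit f(x) = exp(-x^2 / (2 * sigma^2)).

    Illustrative SAF sketch (assumed form): the response is maximal at
    x = 0 and decays toward zero as |x| grows, so activations of unusually
    large magnitude are suppressed rather than propagated.
    """

    def __init__(self, sigma: float = 1.0):
        super().__init__()
        self.sigma = sigma

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.exp(-x.pow(2) / (2.0 * self.sigma ** 2))

# Illustrative use: dropping the unit into a small convolutional block.
block = nn.Sequential(nn.Conv2d(3, 16, kernel_size=3, padding=1),
                      SymmetricActivation(sigma=1.0))
features = block(torch.randn(1, 3, 32, 32))  # shape: (1, 16, 32, 32)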
Similar resources
An Information-Theoretic Discussion of Convolutional Bottleneck Features for Robust Speech Recognition
Convolutional Neural Networks (CNNs) have demonstrated their performance in speech recognition systems, both for feature extraction and for acoustic modeling. In addition, CNNs have been used for robust speech recognition, and competitive results have been reported. The Convolutive Bottleneck Network (CBN) is a kind of CNN that has a bottleneck layer among its fully connected layers. The bottleneck fea...
Complete stability for a Class of Cellular Neural Networks
In this paper, the dynamical behavior of a class of third-order competitive cellular neural networks (CNNs) depending on two parameters is studied. The class contains a one-parameter family of symmetric CNNs, which are known to be completely stable. The main result is that it is a generic property within the family of symmetric CNNs that complete stability is robust with respect to (small) non...
Coefficient Estimates for a General Subclass of m-fold Symmetric Bi-univalent Functions by Using Faber Polynomials
In the present paper, we introduce a new subclass H_{Σ_m}(λ, β) of the m-fold symmetric bi-univalent functions. Also, we find estimates of the Taylor–Maclaurin initial coefficients |a_{m+1}|, |a_{2m+1}| and the general coefficients |a_{mk+1}| (k ≥ 2) for functions in this new subclass. The results presented in this paper would generalize and improve some recent works of several earlier authors.
Compressed Image Hashing using Minimum Magnitude CSLBP
Image hashing tolerates compression, enhancement, and other signal processing operations on digital images, which are usually acceptable manipulations, whereas cryptographic hash functions are very sensitive to even single-bit changes in an image. An image hash is a sum of important quality features in quantized form. In this paper, we propose a novel image hashing algorithm for authentication which i...
Elementary Linear Filtering Tasks Using CNNs With Minimum-Size Templates
In this paper we investigate the linear filtering capabilities of the standard cellular neural network in the general case of non-symmetric templates. We systematically approach CNNs with minimum-size (1 × 3) templates, analyzing in detail their filtering capabilities in the one-dimensional case. Starting from a ...
Journal: CoRR
Volume: abs/1603.05145
Pages: -
Publication date: 2016